Line search
In optimization, the line search strategy is one of two basic iterative approaches to finding a local minimum \mathbf{x}^* of an objective function f:\mathbb{R}^n\to\mathbb{R}. The other approach is the trust region method.
The line search approach first finds a descent direction along which the objective function f will be reduced, and then computes a step size that determines how far \mathbf{x} should move along that direction. The descent direction can be computed by various methods, such as gradient descent, Newton's method, and quasi-Newton methods. The step size can be determined either exactly or inexactly.
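As a concrete illustration of an ''exact'' step size (an added example, not part of the cited article), suppose f is the quadratic f(\mathbf{x})=\tfrac{1}{2}\mathbf{x}^\mathsf{T}A\mathbf{x}-\mathbf{b}^\mathsf{T}\mathbf{x} with A symmetric positive definite. Then h(\alpha)=f(\mathbf{x}+\alpha\mathbf{p}) is a one-dimensional parabola in \alpha, and solving h'(\alpha)=0 gives the step size in closed form:
:\alpha^* = -\frac{\nabla f(\mathbf{x})^\mathsf{T}\mathbf{p}}{\mathbf{p}^\mathsf{T}A\mathbf{p}}.
For a general nonlinear f no such formula is available, which is why inexact rules such as those discussed below are used in practice.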
==Example use==

Here is an example gradient method that uses a line search in step 4.
# Set the iteration counter \displaystyle k=0 and make an initial guess \mathbf{x}_0 for the minimum
# Repeat:
#     Compute a descent direction \mathbf{p}_k
#     Choose \displaystyle \alpha_k to 'loosely' minimize h(\alpha)=f(\mathbf{x}_k+\alpha\mathbf{p}_k) over \alpha\in\mathbb{R}_+ (a runnable sketch of this loop follows the list)
#     Update \mathbf{x}_{k+1}=\mathbf{x}_k+\alpha_k\mathbf{p}_k and set \displaystyle k=k+1
# Until \|\nabla f(\mathbf{x}_k)\| < tolerance
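What follows is a minimal runnable sketch of the loop above, assuming the quadratic test objective f(\mathbf{x})=\tfrac{1}{2}\mathbf{x}^\mathsf{T}A\mathbf{x}-\mathbf{b}^\mathsf{T}\mathbf{x} from the earlier example, so that step 4 can use the closed-form exact step size. The function name and parameters are illustrative choices, not from the article.
<syntaxhighlight lang="python">
import numpy as np

def gradient_descent_exact_line_search(A, b, x0, tol=1e-8, max_iter=1000):
    """Steepest descent with an exact line search for the quadratic
    f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite.
    (Illustrative sketch; names and defaults are not from the article.)"""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):                 # step 2: repeat
        grad = A @ x - b                      # gradient of the quadratic f
        if np.linalg.norm(grad) < tol:        # step 6: until the gradient is small
            break
        p = -grad                             # step 3: steepest-descent direction
        alpha = -(grad @ p) / (p @ (A @ p))   # step 4: exact minimizer of h(alpha)
        x = x + alpha * p                     # step 5: update the iterate
    return x

# The exact minimizer of f solves A x = b, so the result should match it.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
print(gradient_descent_exact_line_search(A, b, x0=np.zeros(2)))
print(np.linalg.solve(A, b))   # reference solution for comparison
</syntaxhighlight>
For a general nonlinear f the closed-form alpha in step 4 would be replaced by an inexact rule, such as the backtracking search sketched below.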
At the line search step (4), the algorithm may either ''exactly'' minimize ''h'', by solving h'(\alpha_k)=0, or ''loosely'', by asking for a sufficient decrease in ''h''. One example of the former is the conjugate gradient method. The latter is called an inexact line search and may be performed in a number of ways, such as a backtracking line search (sketched below) or using the Wolfe conditions.
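The following is a minimal sketch of one inexact rule, a backtracking line search that enforces the Armijo sufficient-decrease condition. The parameter names and the defaults rho=0.5 and c=1e-4 are conventional illustrative choices, not values prescribed by the article.
<syntaxhighlight lang="python">
import numpy as np

def backtracking_line_search(f, grad_fx, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition holds:
        f(x + alpha*p) <= f(x) + c * alpha * grad_fx^T p.
    p must be a descent direction (grad_fx^T p < 0); rho and c are
    conventional defaults, not values from the article."""
    fx = f(x)
    slope = grad_fx @ p          # h'(0), negative along a descent direction
    alpha = alpha0
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho             # backtrack: try a geometrically smaller step
    return alpha

# Usage on the Rosenbrock function with the steepest-descent direction.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
x = np.array([-1.0, 1.0])
p = -grad(x)
print(backtracking_line_search(f, grad(x), x, p))
</syntaxhighlight>
Because the accepted step only has to decrease ''h'' sufficiently rather than minimize it, each iteration is cheap: the loop evaluates f a handful of times instead of solving h'(\alpha)=0.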
Like other optimization methods, line search may be combined with simulated annealing to allow it to jump over some local minima.

Excerpt source: the free encyclopedia Wikipedia.
Read the full text of "Line search" on Wikipedia.


